|
Hi,
I am using C# 2.0 and I came across a section of code that looks like this:
Regex.Matches(strPassword, @"\W").Count
I don't understand what it is trying to count. Can someone please help me out?
Thanks
ma se
|
|
|
|
|
That is counting the number of non-word characters in the string. In .NET regular expressions, \W matches anything that is not a letter, digit, or underscore, so it counts characters like %, &, $ and spaces.
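A minimal, self-contained sketch of what that expression does (the sample password is made up):

```csharp
using System;
using System.Text.RegularExpressions;

class WDemo
{
    static void Main()
    {
        // \W matches any character that is NOT a word character
        // (letters, digits, underscore), so it counts the symbols.
        string strPassword = "p@ss_word 1!";
        int count = Regex.Matches(strPassword, @"\W").Count;

        // '@', the space and '!' match; '_' does NOT, because the
        // underscore counts as a word character.
        Console.WriteLine(count); // prints 3
    }
}
```

Note the underscore: it is why "non-alphanumeric" is not quite the right description of \W.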
|
|
|
|
|
I have a protected web page with a button labelled Mail. When the user clicks that button, a table of links is displayed; each link points to a Word document.
When the user clicks one of these links, the Word document opens and its address appears in the address bar.
The problem: if you copy the address from the bar, close the web site, and later paste the address back into the address bar, the Word document still loads.
I want to prevent that, since these Word documents are for members only.
I am using classic ASP and MS Access.
If anyone has a solution, please help.
Thank you in advance
|
|
|
|
|
Do you have access to the web server the application is being hosted on? There are a few ways this can be done, and most of the good ones require direct access to the web server in some manner.
If you don't, the only way I can suggest is to hold the documents in the database and stream them to the client after doing a permission check. Let me know your circumstances and I will go into a bit more detail if you need it.
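The streaming idea looks roughly like this, sketched as an ASP.NET handler in C# (the original poster is on classic ASP, where Response.BinaryWrite with ADODB.Stream plays the same role). The folder path and the IsMember session flag are assumptions, not anything from the original post:

```csharp
// GetDoc.ashx - serve a Word file only to logged-in members.
// The documents live in a folder OUTSIDE the web root, so there is
// no direct URL that can be copied out of the address bar.
using System;
using System.IO;
using System.Web;

public class GetDoc : IHttpHandler, System.Web.SessionState.IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        // Hypothetical session flag set at login.
        if (context.Session["IsMember"] == null)
        {
            context.Response.StatusCode = 403; // not a member: refuse
            return;
        }

        // Path.GetFileName strips any directory part, which blocks
        // "../" traversal in the doc parameter.
        string name = Path.GetFileName(context.Request.QueryString["doc"]);
        string path = Path.Combine(@"C:\MemberDocs", name);

        context.Response.ContentType = "application/msword";
        context.Response.AddHeader("Content-Disposition",
            "attachment; filename=" + name);
        context.Response.WriteFile(path);
    }

    public bool IsReusable { get { return true; } }
}
```

Because the browser only ever sees /GetDoc.ashx?doc=report.doc, pasting that URL into a fresh session fails the permission check instead of serving the file.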
|
|
|
|
|
Hi,
Thanks for helping me.
I am still developing this website; it's the first project I am developing after college, and I am trying my level best to satisfy them.
I am testing the web page from my localhost only.
I am using MS Access, and I am storing the Word documents in a folder under the inetpub folder. When I run the ASP page, it displays a link to the user's document,
and on click the Word document opens.
The document's address is then shown in the address bar.
The question is: when I copy this address, close the website, and paste the address back into the address bar, the Word document still opens.
That is the only thing I don't want to happen.
Any idea how to solve this problem?
Thanks for helping
|
|
|
|
|
I have a web site where I am using Crystal Reports for some of my reports.
When the client views the reports in his browser he cannot see them, because he doesn't have the Crystal Reports viewer. So I want to create a button on my page: if the user clicks it, the required DLLs have to be copied from the server to the client and registered in his registry.
Can anybody please tell me how I can do this job?
I am using ASP for the web development.
Thanks in advance
sharath Chandra
|
|
|
|
|
chandu_shar wrote: the required DLLs have to be copied from the server to the client and registered in his registry.
Have a small setup package for the Crystal Reports runtime that the client can download and install from the browser.
|
|
|
|
|
I am doing an application in ASP.NET and AJAX, and I am having difficulty implementing the AJAX ValidatorCalloutExtender control.
I am new to AJAX, so could someone please help me resolve this?
Thanks in advance
SAJAN A PILLAI
C#.NET Programmer
TELESOFT INDIA PVT LTD...
BANGALORE
|
|
|
|
|
Get it [^]
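For reference, the basic hookup is a standard validator with Display="None" plus the extender pointing at it. A hedged sketch (control IDs are illustrative; the page also needs a ScriptManager and the AJAX Control Toolkit assembly registered):

```aspx
<%-- Assumes: <%@ Register Assembly="AjaxControlToolkit"
     Namespace="AjaxControlToolkit" TagPrefix="ajaxToolkit" %>
     and an <asp:ScriptManager> somewhere on the page. --%>
<asp:TextBox ID="txtName" runat="server" />
<asp:RequiredFieldValidator ID="rfvName" runat="server"
    ControlToValidate="txtName" Display="None"
    ErrorMessage="Name is required." />
<ajaxToolkit:ValidatorCalloutExtender ID="vceName" runat="server"
    TargetControlID="rfvName" />
```

The extender's TargetControlID must point at the validator, not the textbox, and the validator's Display="None" lets the callout render the message instead.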
Sanjay Kunjam
------------------------------------------------------------------
"Computers are useless. They can only give answers" - Pablo Picasso
------------------------------------------------------------------
|
|
|
|
|
Hi,
There are plenty of articles that say:
Do not store objects in Session or Application scope, for better performance!
Can anybody tell me:
1. What happens when we store objects (e.g. FSO, Dictionary, Recordset) in Session or Application scope?
2. How does performance go down if objects are stored in Session scope?
N.Rajakumar
Application Developer,
|
|
|
|
|
Rajkamal_dfine wrote: Do not store objects in Session or Application scope for better performance !!
It isn't quite that absolute. The advice is really "don't store large objects in Session or Application scope".
Rajkamal_dfine wrote: 1.what happen when we store the Objects (e.g FSO,DICTIONARY,RecordSet) in session or application ?
Assume you have 20,000 or more users online at a time. Every session object lives in server memory for the lifetime of the session, so even 50 KB per session adds up to around 1 GB, and eventually the server runs out of memory. In classic ASP there is a second cost: FSO, Dictionary and ADO Recordset are apartment-threaded COM objects, and storing one in Session ties that session to a single worker thread, which serializes its requests and hurts scalability.
|
|
|
|
|
Any good cross-browser-compatible showModalDialog implementation, other than workarounds like onFocus="this.blur()" and similar tricks?
|
|
|
|
|
I have seen a custom-made one on forums.asp.net[^], used when adding source code to a message.
|
|
|
|
|
|
Does anyone know a website where I can ask questions about strange happenings on an Apache server?
Alternatively, are any of you knowledgeable enough in the workings of Apache to field some questions?
|
|
|
|
|
I don't know much about the internal workings of Apache, but what are the strange happenings you speak of?
"Real programmers just throw a bunch of 1s and 0s at the computer to see what sticks" - Pete O'Hanlon
|
|
|
|
|
Thanks for taking the time to help.
There have been 2 things lately.
First, I put together some Perl scripts, and when I executed them with a system call, system("script-name arg1 arg2 ... argn"), the script seemed to time out after some undetermined amount of processing. The launched script is a long-running script that in turn launches other scripts.
I figured it had something to do with the browser process timing out, so I changed the launching script to use exec instead of system, launching the job in a separate process, and made the launched script capable of running from the command line. I took out all stdin reads, stdout and stderr writes, logged everything to a file, etc.
Things worked for a few days, and I was able to monitor the process via ps and by looking at the logs as the script progressed. As I mentioned, this script launches other scripts as the job progresses; usually two or three of them complete and then the process fails.
I can't seem to find a common failure point, as each day the script fails at a different point of completion, but I haven't had a successful run of the entire process for about a week now.
Because of the trouble launching from a browser, I designed the process so that as long as the first script completes, I can restart from the point of failure, so I am able to complete all the processing manually if I take it a bit at a time. This seems to rule out a data anomaly, but I am thinking there may be a process anomaly causing the failure when a certain state is present. I am looking into making the scripts more bulletproof.
That's the first problem.
The second problem is when I tried to simulate crontab on the cheap. My web hosting package does not support cron and to upgrade to a package that does would be considerably more expensive (US$3.99/mo to US$9.99/mo).
Not to worry, I think: I'm a developer; I can launch a process that checks a simulated queue (a schedule file), sleeps for a predetermined interval, and checks the queue again. After a reasonable amount of time (24 hours) it launches another copy of itself, and the launched process sends a kill signal to get rid of the expiring copy. That way I don't run into a problem where any process executes for too long and gets killed by the sysadmin thinking it is a stalled process.
So I put together a proof-of-concept script that does nothing but write a timestamp message to a log file, sleep for 1 minute, write another timestamp message, and so on for 24 hours. This process seems to die after 10 or 15 minutes, and again I can't pin down a consistent point where it fails.
In theory I could set it up to live for at most 5 minutes and kick off its replacement, but I would rather solve the problem, because I think both problems have a common cause; hopefully by solving one I will solve both.
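The poll-and-respawn loop described above can be sketched like this, shown in C# since that is the common language in this thread (the Perl version is analogous, and the file names and job-runner call are illustrative only):

```csharp
using System;
using System.IO;
using System.Threading;

class FakeCron
{
    static void Main()
    {
        // Live for 24 hours, then hand off to a fresh copy so no
        // single process runs long enough to look stalled.
        DateTime expires = DateTime.Now.AddHours(24);

        while (DateTime.Now < expires)
        {
            // Heartbeat: timestamp to the log, then check the
            // simulated queue (schedule file) for due jobs.
            File.AppendAllText("cron.log",
                DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss") + " tick\n");
            // RunDueJobs("schedule.txt");  // hypothetical job runner
            Thread.Sleep(TimeSpan.FromMinutes(1));
        }

        // Launch the replacement copy; the new copy would then kill
        // this one if it were somehow still running.
        System.Diagnostics.Process.Start(
            System.Reflection.Assembly.GetEntryAssembly().Location);
    }
}
```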
Any thoughts would be greatly appreciated.
|
|
|
|
|
Been a long time since I've done anything with PERL. I'd look through all of the calling scripts and see where it is conking out. I'd maybe look at just ramping up to 9.99/mo for the ability to run cron jobs, but you need to justify if that is a feasible route to go. Sorry I wasn't of too much help.
"Any sort of work in VB6 is bound to provide several WTF moments." - Christian Graus
|
|
|
|
|
Thanks for looking into it.
Paul Conrad wrote: I'd maybe look at just ramping up to 9.99/mo for the ability to run cron jobs
If I thought it would solve all my problems I would. I could reorganize 2 lower priced hosting packages ($7.98 combined) into one with a certificate ($9.99) which would enable me to update crontab via ssh but that would only solve the crontab simulation problem and I would still have the other, daily backup, script problem.
Thanks again for the help.
|
|
|
|
|
I thought the Apache website itself had a large resource for stuff like that. It's been a few years since I have used Apache though.
_____________________________________________
Flea Market! It's just like...it's just like...A MINI-MALL!
|
|
|
|
|
Hi leckey,
Thanks for replying.
Please read my post to Paul above and if you have any thoughts about how to approach this please let me know.
|
|
|
|
|
Just quickly looking at your problem (I'll try to look at it more tonight at home), I unfortunately don't have an idea off the top of my head. Have you tried moving the entire system to another box to see if it still has the same issues?
Are you using a metabase XML file? There is a value in there called "CGI Timeout" that needs increasing.
Also what versions of Perl and Apache are you running?
|
|
|
|
|
Hi leckey,
Thanks for the insight.
leckey wrote: Are you using an metabase XML file? There is a value in there called "CGI Timeout" that needs increasing.
I will look into that in the morning.
|
|
|
|
|
leckey wrote: There is a value in there called "CGI Timeout" that needs increasing.
Looks like this is what is causing my problems. Here is something I came across when I was looking at CGI script guidelines.
Please optimize your scripts relative to memory usage and CPU time. Our system automatically stops scripts that use too many system resources.

Programming Limits:
CPU : 6 seconds @ 100%
Memory : 10240KB (10MB)
Numerical Processes : 12
The reason the backup scripts were varying widely is that I have a random wait (within a range) programmed into the main loops of all the scripts.
I do this to ensure that I don't get scripts executing in lock step, hitting resources like a synchronized artillery barrage.
By making the process sleep for a random interval within a range (i.e. randomly between 3 and 15 seconds), I ensure that even two simultaneous executions of the same script against different db tables do not synchronize.
In fact, two executions of the same script against the same resource (for example a daily backup two days in a row) will take different amounts of time to complete.
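That jitter trick is language-agnostic; here is a minimal sketch in C# (the helper name and the loop body are illustrative):

```csharp
using System;
using System.Threading;

class JitterDemo
{
    static readonly Random Rng = new Random();

    // Sleep a random interval in [minSeconds, maxSeconds] so that
    // concurrent runs of the same script drift apart instead of
    // hitting a shared resource in lock step.
    static void JitterSleep(int minSeconds, int maxSeconds)
    {
        // Random.Next's upper bound is exclusive, hence the +1.
        int seconds = Rng.Next(minSeconds, maxSeconds + 1);
        Thread.Sleep(TimeSpan.FromSeconds(seconds));
    }

    static void Main()
    {
        for (int i = 0; i < 3; i++)
        {
            // ... do one unit of backup work here ...
            JitterSleep(3, 15); // randomly 3 to 15 seconds, as above
        }
    }
}
```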
That solves the backup script anomaly.
As for the proof-of-concept scheduler, I am thinking it sometimes got further because of other scripts that were competing for resources at the time of execution.
In any case, the limits set down are the ones I need to live with. At least now I know what is at play and can break up the processing accordingly.
Thanks for helping to solve this problem for me. I was at my wits end trying to understand what was going on.
|
|
|
|
|
Glad I could help...especially since I haven't worked with Apache since I started grad school in 2004!
|
|
|
|